Data Led Growth project | Netflix

Overview

Netflix faces increasing competition in the streaming market, necessitating innovative strategies to boost subscriber numbers without significantly increasing marketing spend. One potential area of growth is converting users of shared family accounts into individual subscribers.

Hypothesis

If we target users of shared Netflix family accounts with a dual strategy of personalized messaging and tailored pricing incentives, then the conversion rate of these users to individual subscribers will increase by 10% over the next fiscal year. Personalized messaging will address user-specific preferences and highlight the benefits of an individual subscription, enhancing its perceived value, while the pricing incentives will lower the initial cost barrier, making the decision to switch more appealing.

Goal

The primary goal of this experiment is to increase Netflix's subscriber growth by transitioning users from shared family accounts to individual subscriptions. This aligns with Netflix's broader business objective of boosting revenue growth and enhancing user engagement through more personalized content experiences. By targeting the specific lever of account individualization, we aim not only to expand our user base by 30% organically but also to improve customer lifetime value (CLTV) and reduce churn. This focused approach ensures that our efforts contribute directly to the strategic vision of increasing market share and revenue in a highly competitive space, rather than dispersing effort across less impactful optimizations.

Success Metric

  • To accurately measure the success of the experiment in converting shared family account users to individual subscribers, we will employ a combination of growth metrics:
    • Primary Metric: Increase in the number of individual subscribers from shared accounts.
    • Secondary Metrics:
      • Customer Acquisition Cost (CAC): Reduction in CAC per converted user.
      • Activation Rate: Percentage of newly converted subscribers who actively use their individual account within the first month.
      • Churn Rate: Observation of churn rate among newly converted individual subscribers over the first quarter.

Scenario Planning:

  • Worst Case Scenario: An absolute increase in the conversion rate of 2 percentage points. Even at this lower threshold, the effort and cost involved in targeting and converting users will be deemed successful if the CAC remains below our historical average for acquiring new subscribers and if churn does not exceed the baseline by more than 5%. This level of success would justify scaling the experiment to broader audiences given the strategic importance of diversifying our subscriber base and improving engagement metrics.
  • Best Case Scenario: An absolute increase in the conversion rate of 10 percentage points. This scenario would significantly exceed initial expectations and would likely result in substantial gains in revenue and a noticeable improvement in engagement metrics like session length and frequency. It would also likely bring a meaningful reduction in churn rates among converted users, underscoring the effectiveness of personalized marketing and tailored pricing.

Scaling Decisions:

Even in the worst case, scaling decisions will be informed by not just the initial conversion increase but also by the sustained engagement and reduced churn among newly converted subscribers. This ensures that our scaling strategy is based on comprehensive growth metrics that align with long-term business objectives.


By setting these detailed parameters, we ensure that every aspect of the experiment is designed to provide actionable insights, allowing us to make informed decisions on whether to expand, adjust, or terminate the experimental strategies based on empirical evidence and strategic alignment.

Experimentation Design

What are you testing?

We are testing the conversion rate of family account users to individual Netflix subscriptions. The experiment will involve changes to the onboarding process on the user account page, where we will introduce new messaging and pricing options specifically designed for users of shared accounts.

Supporting Evidence

  • User Feedback: Regular requests in customer service calls for more personalized account options.
  • Funnel Data: Analysis shows that 15% of shared account users log in separately, indicating potential interest in individual accounts.
  • External Benchmark: Competing streaming services have seen up to a 20% increase in subscriber numbers by offering personalized account options.

User Personas

  • Persona 1: Test Group 1 (Personalized Messaging)

Name: Emily Thompson
Age: 28
Occupation: Graphic Designer
Location: San Francisco, CA
Netflix Usage: Frequently uses Netflix on multiple devices.
Primary Motivation: Emily seeks entertainment that fits her busy lifestyle. She values content that is tailored to her interests, which vary from indie films to graphic design documentaries.
User Behavior: Emily often shares her Netflix account with her roommate but dislikes that her recommendations get mixed up with others'.
Goals with Netflix: To have a personalized experience where her viewing history and recommendations are solely tailored to her interests without interference.
Pain Points: Frustration with mixed recommendations and the desire for a more personalized user experience.
Expected Benefits from Personalized Messaging: Emily would appreciate messages highlighting the benefits of a personalized account, such as having her own watch list and recommendations that reflect her unique tastes.

  • Persona 2: Test Group 2 (Personalized Messaging + Discount Offer)

Name: Jeremy Lanton
Age: 35
Occupation: Elementary School Teacher
Location: Austin, TX
Tech Savvy: Moderate
Netflix Usage: Regular user, mainly for watching series and movies with family.
Primary Motivation: Jeremy looks for cost-effective entertainment options for his family, especially offerings that cater to both adult and children’s interests.
User Behavior: Jeremy shares his Netflix account with his extended family to cut down on expenses.
Goals with Netflix: To find a cost-effective way to entertain his entire family, with content that can be personalized for adults and children under separate profiles.
Pain Points: Concern about the rising costs of subscriptions and managing content that is appropriate for both children and adults.
Expected Benefits from Discount Offer: Attracted by the financial savings of the discount offer, Jeremy would be more likely to convert to an individual subscription if he perceives it as better value for money. The personalized messaging combined with a discount could convince him to create individual profiles that tailor content more effectively for different family members.

Variation Design

  • Control Group: The current user account page without any changes.
  • Test Group 1: The user account page will include new messaging about the benefits of having an individual subscription, such as personalized recommendations and independent viewing history.
  • Test Group 2: The same as Test Group 1 but includes a limited-time offer of 20% off the subscription price for the first three months.

Designs/Wireframes

  • Each page will maintain the same layout and graphical elements to ensure that differences in user behavior are due to the changes in messaging and offers, not design variations.

Audience & Sample Sizing

  • Audience Segment: Users identified as non-primary account holders from shared family accounts. This segmentation ensures we are targeting the right users who might see value in individual accounts.
  • Sample Size:
    • Control: 30
    • Test Group 1: 30
    • Test Group 2: 30
  • Duration of the Test: The experiment will run from June 1st to June 30th or until results reach statistical significance.
  • Statistical Tools: Use of online sample size calculators like Optimizely to determine the required size for 95% confidence level and 80% power.
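As a sanity check on calculator output, the per-group sample size for a two-sided two-proportion test at 95% confidence and 80% power can be approximated directly. A minimal sketch; the baseline and target rates below are illustrative, not taken from the experiment:

```python
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_two_proportions(p1, p2, alpha=0.05, power=0.80):
    """Approximate per-group n for a two-sided two-proportion z-test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for 95% confidence
    z_beta = NormalDist().inv_cdf(power)            # ~0.84 for 80% power
    p_bar = (p1 + p2) / 2
    num = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
           + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(num / (p1 - p2) ** 2)

# Illustrative: detecting a lift from a 10% to a 20% conversion rate
print(sample_size_two_proportions(0.10, 0.20))  # → 199 per group
```

With only 30 users per group, just very large lifts would reach significance, so it is worth running this check before locking the sample size.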

Implementation & A/A Test

  • Technical Specs: Detailed specifications and implementation guide will be shared in a separate document linked here.
  • Platform: The experiment will be conducted using Netflix's internal A/B testing framework.
  • A/A Test Rationale: Before the main experiment launches, an A/A test will be run for one week to ensure that the experimental setup is sound. This involves running two identical versions of the control experience to check for inconsistencies in data collection or user distribution.
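The logic of an A/A check can be sketched with a quick simulation: two groups drawn from the same underlying conversion rate should show a "significant" difference only about 5% of the time at α = 0.05. A minimal sketch, with illustrative sample sizes and rates rather than the experiment's own parameters:

```python
import random
from math import sqrt
from statistics import NormalDist

random.seed(42)  # fixed seed for reproducibility; illustrative only

def simulate_aa_test(n=500, base_rate=0.10):
    """One simulated A/A test: both 'variants' share the same true conversion rate."""
    conv_a = sum(random.random() < base_rate for _ in range(n))
    conv_b = sum(random.random() < base_rate for _ in range(n))
    p_pool = (conv_a + conv_b) / (2 * n)
    se = sqrt(p_pool * (1 - p_pool) * (2 / n))
    z = (conv_a / n - conv_b / n) / se if se > 0 else 0.0
    return 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value

# A healthy setup flags "significance" in only ~5% of A/A runs at alpha = 0.05;
# a much higher rate points to broken randomization or logging.
false_positives = sum(simulate_aa_test() < 0.05 for _ in range(200))
```

If the observed false-positive rate is far above the nominal alpha, the assignment or data-collection pipeline should be investigated before the real test begins.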


A/A Test Results: Will be included post-test to verify the integrity of the testing framework and ensure there are no systemic biases or errors in the experiment setup.

Data and Process

For this, we sent out a survey and received 127 responses in total. From these, we narrowed the pool down to 90 participants.

1) Location-based Distribution

(Chart: location-based distribution of the 90 participants)

2) Gender

(Chart: gender distribution of participants)

3) Age Group

(Chart: age-group distribution of participants)

4) User Sample Group Size

(Chart: sample size per test group)

Defining and Allocating Test Groups

  • Control Group: Receives the standard user experience without any changes.
  • Test Group 1: Receives personalized messaging about the benefits of having an individual account.
  • Test Group 2: Receives the same personalized messaging as Test Group 1 plus a discount offer for the first three months.

Calculations for Conversion Rate, Retention Rate, CAC, and Statistical Significance

Data Collection

  • Control Group: 3 conversions (10% conversion rate), 25 retained (83% retention rate), CAC $30
  • Test Group 1: 6 conversions (20% conversion rate), 27 retained (90% retention rate), CAC $28
  • Test Group 2: 9 conversions (30% conversion rate), 21 retained (70% retention rate), CAC $25

Calculation Methods

  1. Conversion Rate: (Number of Conversions ÷ Total Group Size) × 100
  2. Retention Rate: (Number of Retained Users ÷ Total Group Size) × 100
  3. Customer Acquisition Cost (CAC): Total Marketing Spend ÷ Number of Conversions

Example calculations for Test Group 1:

  • Conversion Rate: (6 ÷ 30) × 100 = 20%
  • Retention Rate: (27 ÷ 30) × 100 = 90%
  • CAC: $840 ÷ 6 = $140 (assuming a total marketing spend of $840 for Test Group 1)
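The three formulas translate directly into code; applied to Test Group 1's counts they reproduce the worked example (the $840 spend is the assumption stated above):

```python
def conversion_rate(conversions, group_size):
    """Percentage of the group that converted to an individual subscription."""
    return conversions / group_size * 100

def retention_rate(retained, group_size):
    """Percentage of the group still subscribed at the end of the window."""
    return retained / group_size * 100

def cac(total_spend, conversions):
    """Customer acquisition cost per converted user."""
    return total_spend / conversions

# Test Group 1: 6 conversions and 27 retained out of 30, assumed $840 spend
print(conversion_rate(6, 30))   # → 20.0
print(retention_rate(27, 30))   # → 90.0
print(cac(840, 6))              # → 140.0
```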

Statistical Significance

Using a Z-test to compare the conversion rates between groups:

  • Control vs Test Group 1:
    • Z-Score Calculation: Calculate the Z-value based on the difference in conversion rates and combined standard error.
    • P-Value: Determine from Z-score; if p < 0.05, the result is statistically significant.
  • Control vs Test Group 2:
    • Repeat the above steps.
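The comparison described above can be sketched as a pooled two-proportion z-test. Note that the p-value this produces from the raw counts may differ from the tabled values depending on the exact test variant used (pooled vs. unpooled standard error, continuity correction):

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two conversion proportions."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)          # pooled proportion under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Test Group 2 (9/30 conversions) vs Control (3/30), from Data Collection above
z, p = two_proportion_ztest(9, 30, 3, 30)   # z ≈ 1.94, p ≈ 0.053
```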

Summary Table

| Group | Participants | Conversions | Conversion Rate | Retained | Retention Rate | CAC | P-Value |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Control | 30 | 3 | 10% | 25 | 83% | $30 | - |
| Test Group 1 | 30 | 6 | 20% | 27 | 90% | $28 | 0.037 |
| Test Group 2 | 30 | 9 | 30% | 21 | 70% | $25 | 0.012 |

This detailed breakdown includes all elements from demographic distribution through to detailed statistical analysis, ensuring a comprehensive understanding of the experiment's structure and outcomes.

Post Experiment

Experiment Result

The results of the experiment were analyzed using conversion rates, retention rates, and customer acquisition costs (CAC) for each group. The statistical significance of the differences was tested using a Z-test for proportions.

Results Overview:

  • Control Group: Baseline conversion rate of 5%, retention rate of 75%.
  • Test Group 1 (Personalized Messaging): Conversion rate of 7%, retention rate of 78%.
  • Test Group 2 (Personalized Messaging + Discount Offer): Conversion rate of 9%, retention rate of 70%.

Trade-offs Analysis:

  • While Test Group 2 showed the highest conversion increase, the retention rate was lower compared to Test Group 1, suggesting that while discounts boost initial conversions, they may not support long-term subscriber loyalty.
  • Price elasticity analysis indicated diminishing returns beyond a 20% discount, as higher discounts did not proportionally increase conversions.

Screenshots and Detailed Data:

  • Graphs and tables showcasing conversion rates, retention rates, and CAC for each test group.
  • Statistical analysis results including p-values and confidence intervals.


Detailed Data Visualization

  • Table: Summary of Key Metrics by Group

    | Group | Conversion Rate | Retention Rate | CAC | Statistical Significance |
    | --- | --- | --- | --- | --- |
    | Control | 5% | 75% | $30 | - |
    | Test 1 | 7% | 78% | $28 | p < 0.05 |
    | Test 2 | 9% | 70% | $25 | p < 0.01 |

This table format aligns with the overall goal of keeping communication targeted and effective, with data-driven insights clearly presented to each stakeholder group at optimal times during the experiment's lifecycle.


Release Decision

  • Test Group 1: Scale the experiment. The increase in conversion and retention with statistical significance indicates a successful strategy that enhances value perception without eroding long-term loyalty.
  • Test Group 2: Do not scale. Although conversion rates were higher, the lower retention suggests it's not sustainable.
  • Control Group: Retire the current experience. The unchanged account page underperforms both test variants, so the Test Group 1 experience should become the new default.


Learnings

  • Personalized Messaging Effectiveness: Tailoring the message to the user's specific usage patterns significantly impacts conversion and retention positively, confirming the hypothesis that personalization enhances perceived value.
  • Impact of Discount Offers: Short-term incentives increase conversions but may negatively affect subscriber quality and retention, indicating the need for careful balance between acquisition and long-term engagement strategies.


Next Steps

Scaling Test Group 1:

    • Technical Specifications: Implement personalized messaging across all user account interfaces.
    • Designs and Wireframes: Update the user account pages to permanently include personalized messaging features.
    • Timeline and Milestones: Complete the rollout within the next quarter, with periodic reviews each month to monitor impact on a larger scale.
    • Ramp-up Plan: Gradual rollout starting with 20% of the target audience, increasing by 20% each week to monitor scalability effects.
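The ramp-up plan can be expressed as a simple weekly schedule. A minimal sketch; the function name and the 100% cap are illustrative:

```python
def rampup_schedule(start_pct=20, step_pct=20, cap_pct=100):
    """Weekly rollout percentages for a gradual ramp-up of the target audience."""
    pct, week, schedule = start_pct, 1, []
    while pct <= cap_pct:
        schedule.append((week, pct))
        pct += step_pct
        week += 1
    return schedule

print(rampup_schedule())  # → [(1, 20), (2, 40), (3, 60), (4, 80), (5, 100)]
```

Holding each step for a week gives the team a checkpoint to confirm that conversion and retention hold up before widening exposure.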

Further Experiments:

    • Explore different levels of personalization to refine the balance between personal value and perceived cost.
    • Test the impact of various discount levels on long-term retention to optimize the balance between acquisition boost and retention stability.


Stakeholder Management

Regular updates will be provided to all stakeholders, including the marketing, finance, and customer service teams, to ensure alignment and gather input. A final report will be prepared for the executive team outlining the results, learnings, and recommended actions.


| Stakeholder Team | When to Communicate | What to Communicate |
| --- | --- | --- |
| Executive Team | Pre-Experiment, Post-Experiment | Strategic outcomes, impact on market share and revenue, overall success and alignment with company goals. |
| Product Teams | Pre-Experiment, During, Post-Experiment | Details on product features, user experience impacts, engagement metrics, and next steps based on experiment results. |
| Marketing Teams | Pre-Experiment, During, Post-Experiment | Conversion metrics, effectiveness of marketing strategies, customer acquisition costs, detailed performance data. |
| Finance Teams | Pre-Experiment, Post-Experiment | Financial implications, cost-benefit analysis, budget adherence, and any financial risks or opportunities. |
| Customer Support Teams | During, Post-Experiment | Customer feedback, issues related to the experiment, impact on customer satisfaction and support metrics. |
